Two Measures of Dependence
Two families of dependence measures between random variables are introduced.
They are based on the R\'enyi divergence of order $\alpha$ and the relative
$\alpha$-entropy, respectively, and both dependence measures reduce to
Shannon's mutual information when their order $\alpha$ is one. The first
measure shares many properties with the mutual information, including the
data-processing inequality, and can be related to the optimal error exponents
in composite hypothesis testing. The second measure does not satisfy the
data-processing inequality, but appears naturally in the context of distributed
task encoding.
Comment: 40 pages; 1 figure; published in Entropy
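As a concrete illustration of the order-one limit (a minimal numerical sketch, not the paper's exact measures, which involve minimizations over product distributions; this simply evaluates the R\'enyi divergence between a hypothetical joint pmf and the product of its marginals):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p || q) in nats between discrete pmfs."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.isclose(alpha, 1.0):                      # alpha = 1: KL divergence
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

# Hypothetical joint pmf of (X, Y) on a 2x2 alphabet.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
prod = np.outer(p_x, p_y)                           # product of marginals

for alpha in (0.5, 0.9, 0.99, 1.0, 1.01, 1.5):
    d = renyi_divergence(p_xy.ravel(), prod.ravel(), alpha)
    print(f"alpha = {alpha:4.2f}:  D_alpha(P_XY || P_X P_Y) = {d:.6f} nats")
# As alpha -> 1 the values converge to the mutual information
# I(X;Y) = D(P_XY || P_X P_Y), consistent with both dependence measures
# reducing to Shannon's mutual information at order one.
```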
On Multipath Fading Channels at High SNR
This paper studies the capacity of discrete-time multipath fading channels.
It is assumed that the number of paths is finite, i.e., that the channel output
is influenced by the present and by the L previous channel inputs. A
noncoherent channel model is considered where neither the transmitter nor the
receiver is cognizant of the fading's realization, but both are aware of its
statistics.
The focus is on capacity at high signal-to-noise ratios (SNR). In particular,
the capacity pre-loglog, defined as the limiting ratio of the capacity to
loglog SNR as the SNR tends to infinity, is studied. It is shown that,
irrespective of the number of paths L, the capacity pre-loglog is 1.
Comment: To be presented at the 2008 IEEE International Symposium on
Information Theory (ISIT), Toronto, Canada; replaced with version that
appears in the proceedings
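To make the double-logarithmic growth concrete, here is a minimal numerical sketch (illustrative only; the SNR grid is arbitrary and not from the paper) contrasting loglog SNR, which governs the pre-loglog of 1 in this noncoherent setting, with the log SNR growth familiar from coherent or AWGN channels:

```python
import numpy as np

# Illustrative only: contrast log-SNR growth with loglog-SNR growth.
for snr_db in (20, 40, 80, 160):
    snr = 10.0 ** (snr_db / 10.0)       # linear SNR
    print(f"SNR = {snr_db:3d} dB:  log SNR = {np.log(snr):6.2f} nats,  "
          f"loglog SNR = {np.log(np.log(snr)):4.2f} nats")
# Doubling the SNR in dB doubles log SNR but adds only log(2) ~ 0.69 nats
# to loglog SNR: a capacity pre-loglog of 1 means capacity eventually grows
# at this much slower double-logarithmic rate.
```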